Geometric mean of probability measures and geodesics of Fisher information metric

Authors

Abstract

The space of all probability measures having a positive density function on a connected compact smooth manifold $M$, denoted by $\mathcal{P}(M)$, carries the Fisher information metric $G$. We define the geometric mean of probability measures, by the aid of which we investigate the geometry of $\mathcal{P}(M)$ equipped with $G$, and show that the geodesic segment joining arbitrary measures $\mu_1$ and $\mu_2$ is expressed using the normalized geometric mean of its endpoints. As an application, any two points of $\mathcal{P}(M)$ can be joined by a unique geodesic. Moreover, we prove that the function $\ell$ defined by $\ell(\mu_1, \mu_2):=2\arccos\int_M \sqrt{p_1\,p_2}\,d\lambda$, where $\mu_i=p_i\,\lambda$, $i=1,2$, gives the Riemannian distance on $\mathcal{P}(M)$. It is shown that the geodesics are all minimal.
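The distance formula above admits a quick numerical illustration. The following is a minimal sketch, not code from the paper: it approximates the affinity $\int_M \sqrt{p_1\,p_2}\,d\lambda$ by a quadrature sum for two positive densities discretized on a uniform grid over the circle; the function name `fisher_distance` and the test densities are illustrative assumptions.

```python
import numpy as np

def fisher_distance(p1, p2, weights):
    """Distance l(mu1, mu2) = 2*arccos( integral of sqrt(p1*p2) d lambda ),
    with the integral approximated by a quadrature sum over grid points."""
    affinity = np.sum(np.sqrt(p1 * p2) * weights)
    # Clip to guard against round-off pushing the affinity slightly above 1
    return 2.0 * np.arccos(np.clip(affinity, -1.0, 1.0))

# Example: two strictly positive densities on the circle [0, 2*pi),
# discretized on a uniform grid (hypothetical test data)
n = 1000
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
w = np.full(n, 2.0 * np.pi / n)                      # quadrature weights (d lambda)
p1 = (1.0 + 0.5 * np.cos(theta)) / (2.0 * np.pi)     # integrates to 1
p2 = (1.0 + 0.5 * np.sin(theta)) / (2.0 * np.pi)     # integrates to 1

print(fisher_distance(p1, p2, w))
```

For identical densities the affinity equals 1 and the distance is 0; for strictly positive densities the affinity is positive, so the value stays below $\pi$.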


Related articles

Probability Density Functions from the Fisher Information Metric

We show a general relation between the spatially disjoint product of probability density functions and the sum of their Fisher information metric tensors. We then utilise this result to give a method for constructing the probability density functions for an arbitrary Riemannian Fisher information metric tensor. We note further that this construction is extremely unconstrained, depending only on...


Geometric Mean Metric Learning

We revisit the task of learning a Euclidean metric from data. We approach this problem from first principles and formulate it as a surprisingly simple optimization problem. Indeed, our formulation even admits a closed form solution. This solution possesses several very attractive properties: (i) an innate geometric appeal through the Riemannian geometry of positive definite matrices; (ii) ease ...
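The closed-form solution referred to here rests on the Riemannian geometry of positive definite matrices, whose central ingredient is the matrix geometric mean. Below is a hedged sketch of that ingredient only (the standard SPD geodesic formula, not the paper's full metric-learning solution); the function names are illustrative.

```python
import numpy as np

def _spd_power(M, t):
    """M**t for a symmetric positive definite matrix, via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(w ** t) @ V.T

def spd_geometric_mean(A, B, t=0.5):
    """Point at parameter t on the Riemannian geodesic between SPD matrices:
    A #_t B = A^(1/2) (A^(-1/2) B A^(-1/2))^t A^(1/2); t = 0.5 is the geometric mean."""
    A_half = _spd_power(A, 0.5)
    A_half_inv = _spd_power(A, -0.5)
    return A_half @ _spd_power(A_half_inv @ B @ A_half_inv, t) @ A_half

# Hypothetical 2x2 SPD matrices for a sanity check
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, -0.3], [-0.3, 3.0]])
G = spd_geometric_mean(A, B)
# Defining property of the geometric mean: G A^{-1} G = B
print(np.allclose(G @ np.linalg.inv(A) @ G, B))
```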


Asymptotic Fisher information in order statistics of geometric distribution

In this paper, the geometric distribution is considered. The means, variances, and covariances of its order statistics are derived. The Fisher information in any set of order statistics in any distribution can be represented as a sum of Fisher information in at most two order statistics. It is shown that, for the geometric distribution, it can be further simplified to a sum of Fisher informatio...
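As a small, hedged illustration of the quantity being summed (not the paper's order-statistics decomposition): for a single observation from the geometric distribution with pmf $P(X=k)=(1-p)^{k-1}p$, $k\ge 1$, the Fisher information about $p$ is $1/(p^2(1-p))$. The sketch below checks this numerically; the function name and truncation level are assumptions.

```python
import numpy as np

def fisher_info_geometric(p, k_max=10000):
    """Fisher information about p for one observation X ~ Geometric(p),
    pmf P(X=k) = (1-p)**(k-1) * p for k = 1, 2, ..., computed as E[(d/dp log pmf)^2]."""
    k = np.arange(1, k_max + 1)
    pmf = (1.0 - p) ** (k - 1) * p
    score = 1.0 / p - (k - 1) / (1.0 - p)   # derivative of log pmf with respect to p
    return np.sum(pmf * score ** 2)         # expectation, truncated at k_max

p = 0.3
print(fisher_info_geometric(p))             # numerical value
print(1.0 / (p ** 2 * (1.0 - p)))           # closed form 1 / (p^2 (1 - p)) for comparison
```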


Generalized information-entropy measures and Fisher information

We show how Fisher's information, already known to have a particular character as the fundamental information-geometric object that plays the role of a metric tensor for a statistical differential manifold, can be derived in a relatively easy manner through the direct application of a generalized logarithm and exponential formalism to generalized information-entropy measures. We shall first shortly descr...


Dynamics of the Fisher information metric.

We present a method to generate probability distributions that correspond to metrics obeying partial differential equations generated by extremizing a functional $J[g_{\mu\nu}(\theta^i)]$, where $g_{\mu\nu}(\theta^i)$ is the Fisher metric. We postulate that this functional of the dynamical variable $g_{\mu\nu}(\theta^i)$ is stationary with respect to small variations of these variables. Our approach ...



Journal

Journal title: Mathematische Nachrichten

Year: 2023

ISSN: 1522-2616, 0025-584X

DOI: https://doi.org/10.1002/mana.202000167